161 research outputs found
Stability and sensitivity of Learning Analytics based prediction models
Learning analytics seeks to enhance learning processes through systematic measurement of learning-related data and to provide informative feedback to learners and educators. Track data from Learning Management Systems (LMSs) constitute a main data source for learning analytics. This empirical contribution provides an application of Buckingham Shum and Deakin Crick's theoretical framework of dispositional learning analytics: an infrastructure that combines learning dispositions data with data extracted from computer-assisted, formative assessments and LMSs. In two cohorts of a large introductory quantitative methods module, 2049 students were enrolled in a module based on principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials. We investigated the predictive power of learning dispositions, outcomes of continuous formative assessments and other system-generated data in modelling student performance, and their potential to generate informative feedback. Using a dynamic, longitudinal perspective, computer-assisted formative assessments seem to be the best predictor for detecting underperforming students and for academic performance, while basic LMS data did not substantially predict learning. If timely feedback is crucial, both use-intensity-related track data from e-tutorial systems and learning dispositions are valuable sources for feedback generation.
The Potential for Student Performance Prediction in Small Cohorts with Minimal Available Attributes
The measurement of student performance during their progress through university study provides academic leadership with critical information on each student's likelihood of success. Academics have traditionally used their interactions with individual students through class activities and interim assessments to identify those "at risk" of failure/withdrawal. However, modern university environments, offering easy online availability of course material, may see reduced lecture/tutorial attendance, making such identification more challenging. Modern data mining and machine learning techniques provide increasingly accurate predictions of student examination assessment marks, although these approaches have focussed upon large student populations and wide ranges of data attributes per student. However, many university modules comprise relatively small student cohorts, with institutional protocols limiting the student attributes available for analysis. Very little research attention appears to have been devoted to this area of analysis and prediction. We describe an experiment conducted on a final-year university module student cohort of 23, where individual student data are limited to lecture/tutorial attendance, virtual learning environment accesses and intermediate assessments. We found potential for predicting individual student interim and final assessment marks in small student cohorts with very limited attributes, and that these predictions could be useful to support module leaders in identifying students potentially "at risk".
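The abstract does not specify the modelling technique used, but the setup it describes (a cohort of 23 with only attendance, VLE accesses and interim marks as predictors) can be illustrated with a minimal sketch. The following example uses entirely synthetic data and ordinary least squares; the variable names and the assumed relationship between attributes and final mark are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical illustration of small-cohort mark prediction from the three
# attribute types the abstract names. All data below are synthetic.
rng = np.random.default_rng(0)
n = 23  # cohort size mentioned in the abstract

attendance = rng.uniform(0.4, 1.0, n)            # fraction of sessions attended
vle_accesses = rng.poisson(40, n).astype(float)  # VLE access counts
interim = rng.uniform(40, 90, n)                 # interim assessment mark (%)

# Synthetic "true" relationship plus noise, purely for demonstration.
final_mark = 10 * attendance + 0.1 * vle_accesses + 0.8 * interim \
    + rng.normal(0, 3, n)

# Design matrix with an intercept column; fit by ordinary least squares.
X = np.column_stack([np.ones(n), attendance, vle_accesses, interim])
coef, *_ = np.linalg.lstsq(X, final_mark, rcond=None)

predicted = X @ coef
residual_sd = np.std(final_mark - predicted)
print(f"coefficients: {np.round(coef, 2)}")
print(f"residual SD: {residual_sd:.2f} marks")
```

With so few observations, estimates of this kind are noisy; the study's point is precisely whether such limited data still carry enough signal to flag students potentially at risk.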
Understanding academics' resistance towards (online) student evaluation
Many higher education institutions and academic staff are still sceptical about the validity and reliability of student evaluation questionnaires, in particular when these evaluations are completed online. One month after a university-wide transition from paper to online evaluation across 629 modules, (perceived) resistance and ambivalence amongst academic staff were unpacked. A mixed-method study was conducted amongst 104 academics using survey methods and follow-up semi-structured interviews. Despite a successful "technical" transition (i.e. a response rate of 60% and scores similar to previous evaluations), more than half of respondents reported a negative experience with this transition. The results indicate that the multidimensional nature of ambivalence towards change and the dual nature of student evaluations can influence the effectiveness of organisational transition processes.
Eliciting students' preferences for the use of their data for learning analytics. A crowdsourcing approach.
Research on student perspectives of learning analytics suggests that students are generally unaware of the collection and use of their data by their learning institutions, and they are often not involved in decisions about whether and how their data are used. To determine the influence of risks and benefits awareness on students' data use preferences for learning analytics, we designed two interventions: one describing the possible privacy risks of data use for learning analytics and the second describing the possible benefits. These interventions were distributed amongst 447 participants recruited using a crowdsourcing platform. Participants were randomly assigned to one of three experimental groups (risks, benefits, and risks and benefits) and received the corresponding intervention(s). Participants in the control group received a learning analytics dashboard (as did participants in the experimental conditions). Participants indicated the motivation for their data use preferences. Chapter 11 will discuss the implications of our findings in relation to how to better support learning institutions in being more transparent with students about the practice of learning analytics.
Supporting self-regulated learning in a blended learning environment using prompts and learning analytics
Higher education institutions, teachers, and students face new difficulties and opportunities resulting from the introduction of modern technology into the learning process. The widespread adoption of learning environments that integrate online and face-to-face learning may present opportunities as well as difficulties for some students' self-regulation skills. Providing automated prompts may help to support those students with insufficient self-regulation skills. The use of learning analytics and multiple methods and data sources (data triangulation) may give better insight into the self-regulation process. The objective of the proposed research is to explore students' evaluation of the usefulness of prompts implemented in a blended learning environment. A secondary objective is to develop and evaluate a real-time dashboard designed to notify teachers of student responses to deployed prompts. The research methodology will be grounded in action research and empirical research. The scientific contribution will be achieved through the development of artefacts and the performance of empirical research to advance understanding of students' self-regulation in a blended learning environment.
Innovating Pedagogy 2015: Open University Innovation Report 4
This series of reports explores new forms of teaching, learning and assessment for an interactive world, to guide teachers and policy makers in productive innovation. This fourth report proposes ten innovations that are already in currency but have not yet had a profound influence on education. To produce it, a group of academics at the Institute of Educational Technology in The Open University collaborated with researchers from the Center for Technology in Learning at SRI International. We proposed a long list of new educational terms, theories, and practices. We then pared these down to ten that have the potential to provoke major shifts in educational practice, particularly in post-school education. Lastly, we drew on published and unpublished writings to compile the ten sketches of new pedagogies that might transform education. These are summarised below in an approximate order of immediacy and timescale to widespread implementation.
Analytics in online and offline language learning environments: the role of learning design to understand student online engagement
Language education has a rich history of research and scholarship focusing on the effectiveness of learning activities and the impact these have on student behaviour and outcomes. One of the basic assumptions in foreign language pedagogy, and CALL in particular, is that learners want to be able to communicate effectively with native speakers of their chosen language. Combining principles of learning analytics and Big Data with learning design, this study used a student-activity-based taxonomy adopted by the Open University UK to inform module design. The learning designs of four introductory and intermediary language education modules and the online engagement of 2111 learners were contrasted using weekly learning design data. In this study, we aimed to explore how learning design decisions made by language teachers influenced students' engagement in the VLE. Using fixed effect models, our findings indicated that 55% of variance of weekly online engagement in these four modules was explained by the way language teachers designed weekly learning design activities. Our learning analytics study highlights the potential affordances for CALL researchers to use the power of learning design and big data to explore and understand the complexities and dynamics of language learning for students and teachers.
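The fixed-effect modelling the abstract mentions, where weekly engagement is regressed on learning-design data with per-module effects, can be sketched as follows. This is a minimal illustration on synthetic data: the module baselines, the "design hours" regressor and the effect size are all invented for demonstration, not drawn from the study.

```python
import numpy as np

# Hedged sketch (synthetic data): regress weekly VLE engagement on weekly
# learning-design activity time, with module fixed effects as dummy columns.
rng = np.random.default_rng(1)
n_modules, n_weeks = 4, 30
module = np.repeat(np.arange(n_modules), n_weeks)
design_hours = rng.uniform(2, 10, n_modules * n_weeks)  # planned weekly workload

# Synthetic engagement: module-specific baseline plus a design effect and noise.
baseline = np.array([5.0, 8.0, 6.5, 7.0])
engagement = baseline[module] + 1.5 * design_hours \
    + rng.normal(0, 2, n_modules * n_weeks)

# Design matrix: one dummy per module (the fixed effects) plus the regressor.
dummies = np.eye(n_modules)[module]
X = np.column_stack([dummies, design_hours])
coef, *_ = np.linalg.lstsq(X, engagement, rcond=None)

# Proportion of variance explained, analogous to the abstract's 55% figure.
fitted = X @ coef
ss_res = np.sum((engagement - fitted) ** 2)
ss_tot = np.sum((engagement - engagement.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"design-hours coefficient: {coef[-1]:.2f}")
print(f"variance explained (R^2): {r_squared:.2f}")
```

The module dummies absorb stable between-module differences, so the remaining coefficient reflects how week-to-week design decisions relate to engagement within modules, which is the comparison the study is interested in.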
Introducing innovative technologies in higher education: An experience in using geographic information systems for the teaching-learning process
In today's world, new technologies are being used for the teaching-learning process in the classroom. Their use to support learning can provide significant advantages for the teaching-learning process and have potential benefits for students, as many of these technologies are a part of the work life of many current professions. The aim of this study is to analyse the use of innovative technologies for engineering and science education after examining the data obtained from students in their learning process and experiences. The study has been focused on computational geographic information systems, which allow access to and management of large volumes of information and data, and on the assessment of this tool as a basis for a suitable methodology to enhance the teaching-learning process, taking into account the great social impact of big data. The results allow identifying the main advantages, opportunities, and drawbacks of using these technological tools for educational purposes. Finally, a set of initiatives has been proposed to complement the teaching activity and to improve user experience in the educational field. This study was supported by the Spanish Research Agency and the European Regional Development Fund under project CloudDriver4Industry TIN2017-89266-R.
- …